5 - PDE based Image Processing [ID:42274]

I think we can start.

So, last time we discussed the issues that you get when you smooth an image. We also got a heat equation, discretized by the same scheme as in the discrete setup. In this case, with tau the time step and h the pixel size, we have something like

u_i^{k+1} = u_i^k + (tau / h^2) (u_{i+1}^k - 2 u_i^k + u_{i-1}^k),

which is the explicit Euler scheme. Here u_i^k denotes u(x_i, t_k), tau the time step and h the pixel size, in one dimension at least.
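This explicit Euler step can be sketched in a few lines of numpy. The boundary treatment is not fixed in the lecture; here replicate (Neumann-type) boundaries are assumed for illustration.

```python
import numpy as np

def heat_step(u, tau, h):
    """One explicit Euler step of the 1D heat equation u_t = u_xx.

    Implements u_i^{k+1} = u_i^k + (tau/h^2) * (u_{i+1}^k - 2 u_i^k + u_{i-1}^k)
    with replicate (Neumann-type) boundaries -- an assumption, the lecture
    does not fix the boundary condition here.
    """
    up = np.pad(u, 1, mode="edge")           # u_{-1} = u_0, u_N = u_{N-1}
    return u + (tau / h**2) * (up[2:] - 2.0 * u + up[:-2])

# A noisy step edge: a few steps reduce the noise (and, as discussed,
# also blur the edge).
rng = np.random.default_rng(0)
u = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.standard_normal(100)
for _ in range(10):
    u = heat_step(u, tau=0.25, h=1.0)        # tau <= h^2/2, see below
```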

And we've also seen that if we have tau less or equal h^2 divided by two, then this is a convex combination. So it's of the form

u_i^{k+1} = sum_j w_{ij} u_j^k,

and here we have j in a neighborhood of i.

So the neighborhood in 1D would just be the left and right neighbor and the point itself. We could of course generalize that to larger neighborhoods, or go to 2D or 3D, where we have for example these neighboring points as well. The essential thing is, of course, that the weights are non-negative and that their sum, if you sum over j, is equal to one.

Okay.

And we also had this kind of convolution structure, which we also had for the solution of the heat equation. So actually, in this case the weights depend only on the difference i - j, even only on the absolute value |i - j|, so it was symmetric to the left and right.
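The convex-combination and convolution structure can be checked directly: for the heat scheme the weights form the fixed 3-point stencil [tau/h^2, 1 - 2 tau/h^2, tau/h^2], and one time step is a convolution with it. A minimal numpy sketch:

```python
import numpy as np

# Weights of the 1D heat scheme: they depend only on |i - j|.
tau, h = 0.2, 1.0
c = tau / h**2
w = np.array([c, 1.0 - 2.0 * c, c])

# Convex combination exactly when tau <= h^2 / 2:
assert np.all(w >= 0) and np.isclose(w.sum(), 1.0)

# One time step is a convolution with this stencil (away from the boundary):
u = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
u_next_interior = np.convolve(u, w, mode="valid")          # interior points
direct = u[1:-1] + c * (u[2:] - 2.0 * u[1:-1] + u[:-2])    # scheme, written out
assert np.allclose(u_next_interior, direct)
```

Since the stencil is symmetric, convolution and correlation coincide here, matching the left-right symmetry noted above.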

We've seen this is good to eliminate noise, but it has problems at the edges. So what we said: one option would be to choose w_{ij} not arbitrarily, but as a function of the gradient of u, maybe in the neighborhood of i or on an even larger domain, or of some smoothed version, for example convolved with a Gaussian, in order to avoid the noise and still detect where you have a large gradient.

If we have a noisy image like this, where you think of a clean image like this, then the gradient of u would be large more or less everywhere; but that is the gradient of the noise, not of the edge. Now the gradient can be exchanged with the convolution, as you can easily check. So if you compute the gradient of the smoothed function, it will look like this, probably more like this.
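The claim that the gradient can be exchanged with the convolution, and that smoothing suppresses the noise gradient while keeping the edge gradient, can be verified numerically. A sketch with a sampled Gaussian kernel (the kernel construction and central-difference gradient are illustrative choices, not fixed by the lecture):

```python
import numpy as np

def gauss_kernel(sigma, radius):
    """Sampled 1D Gaussian, normalized to sum 1 (a sketch, not exact quadrature)."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2.0 * sigma**2))
    return g / g.sum()

def grad(u):
    """Central-difference gradient with replicate boundaries."""
    up = np.pad(u, 1, mode="edge")
    return 0.5 * (up[2:] - up[:-2])

rng = np.random.default_rng(1)
edge = np.concatenate([np.zeros(100), np.ones(100)])
u = edge + 0.05 * rng.standard_normal(edge.size)

g = gauss_kernel(sigma=3.0, radius=9)
u_sigma = np.convolve(u, g, mode="same")

# grad(G * u) == G * grad(u): gradient and convolution commute
# (compare away from the ends, where padding conventions differ).
a = grad(u_sigma)[20:-20]
b = np.convolve(grad(u), g, mode="same")[20:-20]
assert np.allclose(a, b)

# After smoothing, the largest gradient sits at the edge, not in the noise.
assert 95 <= np.argmax(np.abs(grad(u_sigma))) <= 105
```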

And then we see we have a large gradient here, and we have small gradients away from the edges. So the small noise, the small oscillations, are essentially gone. The gradient coming from the edge is still there; in the sense that it's a bit smaller, but still this is by far the largest gradient here.

So how can we incorporate that? Let's try everything in 1D first. We write the scheme as: it's always u_i^k, and then we write tau divided by h squared times (u_{i+1}^k - u_i^k), minus tau divided by h squared times (u_i^k - u_{i-1}^k):

u_i^{k+1} = u_i^k + (tau / h^2) (u_{i+1}^k - u_i^k) - (tau / h^2) (u_i^k - u_{i-1}^k).

Okay, this is just the same scheme rewritten; note this one comes with a plus and this one with a minus. So we have two different differences here, and we would now like to put weights on them.
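Putting a weight on each of the two differences can be sketched as follows. The lecture has not yet fixed the weight function, so as an illustration a Perona-Malik-type diffusivity g(s) = 1 / (1 + (s/lam)^2) is assumed here; the parameter lam is hypothetical.

```python
import numpy as np

def adaptive_step(u, tau, h, lam=0.1):
    """One explicit step with a weight on each one-sided difference:

    u_i^{k+1} = u_i + (tau/h^2) * [ w_{i+1/2} (u_{i+1} - u_i)
                                  - w_{i-1/2} (u_i - u_{i-1}) ].

    The weight w = g(|difference| / h) with g(s) = 1 / (1 + (s/lam)^2)
    (a Perona-Malik-type diffusivity, chosen only as an illustration)
    is small where the local gradient is large, so diffusion across
    edges is reduced while flat, noisy regions are still smoothed.
    """
    up = np.pad(u, 1, mode="edge")
    d_plus = up[2:] - u            # forward difference  u_{i+1} - u_i
    d_minus = u - up[:-2]          # backward difference u_i - u_{i-1}
    g = lambda s: 1.0 / (1.0 + (s / lam) ** 2)
    return u + (tau / h**2) * (g(np.abs(d_plus) / h) * d_plus
                               - g(np.abs(d_minus) / h) * d_minus)

# Across a sharp step the weight is tiny, so the edge barely diffuses:
u = np.concatenate([np.zeros(5), np.ones(5)])
v = adaptive_step(u, tau=0.2, h=1.0, lam=0.1)
```

Note that the weight attached to the forward difference at i equals the weight attached to the backward difference at i+1 (both are g(|u_{i+1} - u_i| / h)), so the two one-sided fluxes stay consistent.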

Part of a video series:
Access: Open Access
Duration: 01:36:23 min
Recording date: 2022-05-24
Uploaded: 2022-05-24 15:19:07
Language: de-DE
